
revert: "fix(provider): restore parameter transparency in core LLM provider adapters" #7023

Merged
Soulter merged 1 commit into master from revert-6934-fix/provider-kwargs on Mar 26, 2026

Conversation

@Soulter Soulter (Member) commented Mar 26, 2026

Reverts #6934

Summary by Sourcery

Enhancements:

  • Limit Anthropic, Gemini, and OpenAI chat payloads back to explicit message and model fields instead of transparently passing through extra parameters.

@auto-assign auto-assign bot requested review from Raven95676 and anka-afk March 26, 2026 17:57
@dosubot dosubot bot added the size:S This PR changes 10-29 lines, ignoring generated files. label Mar 26, 2026
@Soulter Soulter merged commit 045be79 into master Mar 26, 2026
6 checks passed
@Soulter Soulter deleted the revert-6934-fix/provider-kwargs branch March 26, 2026 17:58
@sourcery-ai sourcery-ai bot (Contributor) left a comment

Hey - I've left some high level feedback:

  • Now that kwargs are no longer forwarded into payloads for Anthropic, Gemini, and OpenAI, consider removing or tightening the **kwargs parameter in these methods (or explicitly documenting the accepted options) to avoid confusion and silent no-ops for callers passing extra arguments.

@dosubot dosubot bot added the area:provider The bug / feature is about AI Provider, Models, LLM Agent, LLM Agent Runner. label Mar 26, 2026
@gemini-code-assist gemini-code-assist bot (Contributor) left a comment

Code Review

This pull request removes the inclusion of **kwargs in the API payloads for Anthropic, Gemini, and OpenAI providers. Feedback indicates that this change is a functional regression as it breaks parameter transparency, preventing users from passing provider-specific arguments to the underlying APIs.

```diff
  model = model or self.get_model()

- payloads = {**kwargs, "messages": new_messages, "model": model}
+ payloads = {"messages": new_messages, "model": model}
```

Severity: high

This change removes `**kwargs` from the `payloads` dictionary. The `text_chat` method is designed to accept arbitrary keyword arguments (`**kwargs`) to allow for parameter transparency, enabling users to pass provider-specific parameters directly to the underlying API. Removing `**kwargs` here prevents this flexibility and can lead to a loss of functionality or extensibility for users who rely on passing additional parameters to the Anthropic API. This reverts the intended behavior of restoring parameter transparency.

```diff
  model = model or self.get_model()

- payloads = {**kwargs, "messages": new_messages, "model": model}
+ payloads = {"messages": new_messages, "model": model}
```

Severity: high

Similar to the `text_chat` method, removing `**kwargs` from the `payloads` in `text_chat_stream` eliminates the ability to pass provider-specific parameters to the Anthropic API. This reduces the flexibility and extensibility of the API, which is a functional regression.

```diff
  model = model or self.get_model()

- payloads = {**kwargs, "messages": context_query, "model": model}
+ payloads = {"messages": context_query, "model": model}
```

Severity: high

The removal of `**kwargs` from the `payloads` dictionary in `text_chat` for the Gemini provider restricts the ability to pass through additional, provider-specific parameters. This limits the extensibility of the API and can prevent users from utilizing features of the Gemini API that are not explicitly defined in the method signature.

```diff
  model = model or self.get_model()

- payloads = {**kwargs, "messages": context_query, "model": model}
+ payloads = {"messages": context_query, "model": model}
```

Severity: high

Removing `**kwargs` from the `payloads` in `text_chat_stream` for the Gemini provider removes parameter transparency. This prevents users from passing custom or provider-specific arguments to the underlying API, which is a functional regression and limits the API's flexibility.

```diff
  model = model or self.get_model()

- payloads = {**kwargs, "messages": context_query, "model": model}
+ payloads = {"messages": context_query, "model": model}
```

Severity: high

The `_prepare_chat_payload` function is intended to gather all necessary parameters, including `**kwargs`, for the OpenAI API call. Removing `**kwargs` from the `payloads` dictionary here means that any extra parameters passed to the `text_chat` or `text_chat_stream` methods will not be forwarded to the OpenAI API. This significantly reduces the flexibility and extensibility of the provider, as users will be unable to leverage additional API features or custom settings supported by OpenAI but not explicitly exposed in the method signature.
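The behavioral difference the reviewer describes can be seen by contrasting the two payload builders from the diff side by side. The function names below are hypothetical stand-ins for the before/after versions of `_prepare_chat_payload`:

```python
# Hypothetical contrast of the two payload builders from the diff: after the
# revert, extra keyword arguments are accepted by the signature but silently
# dropped from the payload, the no-op behavior Sourcery warned about.

def prepare_before_revert(context_query, model, **kwargs):
    return {**kwargs, "messages": context_query, "model": model}

def prepare_after_revert(context_query, model, **kwargs):
    return {"messages": context_query, "model": model}

before = prepare_before_revert([], "gpt-4o", top_p=0.9)
after = prepare_after_revert([], "gpt-4o", top_p=0.9)
```

`before` carries `"top_p"` through to the API payload, while `after` discards it even though the call succeeds, so a caller tuning `top_p` gets no error and no effect.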

@Soulter Soulter mentioned this pull request Mar 27, 2026
